Minimum Mutual Information and Non-Gaussianity through the Maximum Entropy Method: Estimation from Finite Samples

Authors

  • Carlos A. L. Pires
  • Rui A. P. Perdigão
Abstract

The Minimum Mutual Information (MinMI) principle provides the least committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets $T_{cr}$ comprising $m_{cr}$ linear and/or nonlinear joint expectations, computed from samples of $N$ iid outcomes. Marginals (and their entropies) are imposed through single morphisms (variable-wise transformations) of the original random variables. $N$-asymptotic formulas are given for the distribution of the cross-expectation estimation errors, as well as for the bias, variance and distribution of the MinMI estimate. A growing $T_{cr}$ leads to an increasing MinMI, eventually converging to the total MI. Under $N$-sized samples, the MinMI increment between two nested sets $T_{cr1} \subseteq T_{cr2}$ (with numbers of constraints $m_{cr1} \le m_{cr2}$) is the test difference $\Delta H \equiv H_{\max 1,N} - H_{\max 2,N} \ge 0$ between the two respective estimated MEs. Asymptotically, $\Delta H$ follows a Chi-Squared distribution $\chi^2_{(m_{cr2}-m_{cr1})}/(2N)$, whose upper quantiles determine whether the constraints in $T_{cr2} \setminus T_{cr1}$ explain significant extra MI. As an example, we set the marginals to be normally distributed (Gaussian) and build a sequence of MI bounds associated with successive nonlinear correlations due to joint non-Gaussianity. Since sample sizes available in real-world situations can be rather small, the relationship between MinMI bias, probability-density over-fitting and outliers is highlighted for under-sampled data.
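A minimal sketch of the significance test this asymptotic result implies, assuming the two maximum-entropy values have already been estimated from the same $N$-sized sample under nested constraint sets; the function name and the numbers in the example are illustrative, not taken from the paper.

    # Hedged sketch: chi-squared test for the MinMI increment between
    # nested constraint sets Tcr1 and Tcr2 (illustrative names throughout).
    from scipy.stats import chi2

    def minmi_increment_test(h_max1, h_max2, m_cr1, m_cr2, n, alpha=0.05):
        """Test whether the constraints in Tcr2 but not in Tcr1 explain
        significant extra MI. Under the null, Delta_H = h_max1 - h_max2
        follows chi2(m_cr2 - m_cr1) / (2N) asymptotically in N."""
        delta_h = h_max1 - h_max2                          # MinMI increment
        dof = m_cr2 - m_cr1                                # extra constraints
        threshold = chi2.ppf(1.0 - alpha, dof) / (2.0 * n) # rejection level
        p_value = chi2.sf(2.0 * n * delta_h, dof)
        return delta_h, threshold, p_value

    # Example: 3 extra nonlinear cross constraints, N = 500
    dh, thr, p = minmi_increment_test(h_max1=2.83, h_max2=2.79,
                                      m_cr1=1, m_cr2=4, n=500)
    print(f"Delta_H = {dh:.4f}, threshold = {thr:.4f}, p = {p:.2e}")

With these illustrative numbers, $2N\Delta H = 40$ far exceeds the 95% quantile of $\chi^2_3$ (about 7.81), so the extra constraints would be judged to carry significant additional MI.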

Similar Articles

Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X, Y), between random variables X and Y, compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds that increasingly approach the true MI. In particular, using standard bivariate Gaus...
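As a concrete first rung of such a hierarchy: with Gaussian marginals and the linear correlation as the only cross constraint, the MinMI bound reduces to the bivariate-Gaussian value $I_g = -\tfrac{1}{2}\ln(1-\rho^2)$. A minimal sketch, assuming standardized data and illustrative names:

    # Gaussian MI lower bound from the sample correlation alone.
    import numpy as np

    def gaussian_mi_lower_bound(x, y):
        """MinMI (in nats) under Gaussian marginals and a single linear
        cross constraint: I_g = -0.5 * ln(1 - rho^2)."""
        rho = np.corrcoef(x, y)[0, 1]
        return -0.5 * np.log1p(-rho ** 2)   # log1p(-r^2) = log(1 - r^2)

    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000)
    y = 0.6 * x + 0.8 * rng.standard_normal(10_000)  # correlation ~ 0.6
    print(gaussian_mi_lower_bound(x, y))             # ~0.22 nats

Any further (nonlinear) joint constraint can only raise this bound toward the true MI.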

Limitations of state estimation: absolute lower bound of minimum variance estimation/filtering, Gaussianity-whiteness measure (joint Shannon-Wiener entropy), and Gaussianing-whitening filter (maximum Gaussianity-whiteness measure principle)

This paper aims at obtaining performance limitations of state estimation in terms of variance minimization (minimum variance estimation and filtering) using information theory. Two new notions, negentropy rate and Gaussianity-whiteness measure (joint Shannon-Wiener entropy), are proposed to facilitate the analysis. Topics such as Gaussianing-whitening filter (the maximum Gaussianity-whiteness m...
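The negentropy notion underlying such Gaussianity measures is $J(X) = H(X_G) - H(X) \ge 0$, where $X_G$ is Gaussian with the same variance as $X$; $J$ vanishes iff $X$ is Gaussian. A rough moment-based sketch (the classical skewness/kurtosis expansion, an illustrative stand-in for the paper's exact estimator):

    # Approximate negentropy from third and fourth moments; the expansion
    # is accurate only near Gaussianity and is purely illustrative here.
    import numpy as np

    def negentropy_approx(x):
        """Skewness/kurtosis approximation of negentropy (nats)."""
        z = (x - x.mean()) / x.std()
        skew = np.mean(z ** 3)
        exkurt = np.mean(z ** 4) - 3.0
        return skew ** 2 / 12.0 + exkurt ** 2 / 48.0

    rng = np.random.default_rng(1)
    print(negentropy_approx(rng.standard_normal(50_000)))   # ~0: Gaussian
    print(negentropy_approx(rng.exponential(size=50_000)))  # > 0: non-Gaussian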

Quadratic Mutual Information Feature Selection

We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method uses direct estimation of quadratic mutual information from data samples using Gaussian kernel functions, and can detect second-order nonlinear relations. Its main advantages are: (i) unified analysis of discrete and continuous dat...
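A minimal sketch of a Cauchy–Schwarz quadratic-MI estimator with Gaussian kernels, in the spirit of the description above; the bandwidth choice and all names are assumptions, not the authors' code:

    import numpy as np

    def _gauss(d2, sigma):
        # Gaussian kernel evaluated at squared distances d2
        return np.exp(-d2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

    def qmi_cs(x, y, sigma=0.5):
        """Cauchy-Schwarz QMI: log(V_J * V_M / V_c**2) >= 0, zero when the
        kernel joint density estimate factorizes into the marginals."""
        s = sigma * np.sqrt(2.0)  # kernel convolution doubles the variance
        kx = _gauss((x[:, None] - x[None, :]) ** 2, s)
        ky = _gauss((y[:, None] - y[None, :]) ** 2, s)
        v_joint = np.mean(kx * ky)                            # int p_J^2
        v_marg = np.mean(kx) * np.mean(ky)                    # int p_X^2 * int p_Y^2
        v_cross = np.mean(kx.mean(axis=1) * ky.mean(axis=1))  # int p_J * p_M
        return np.log(v_joint * v_marg / v_cross ** 2)

    rng = np.random.default_rng(2)
    x = rng.standard_normal(400)
    print(qmi_cs(x, x ** 2))                    # detects a second-order relation
    print(qmi_cs(x, rng.standard_normal(400)))  # near zero for independent data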

Minimum Entropy Estimation of Hierarchical Random Graph Parameters for Character Recognition

In this paper, we propose a new parameter estimation method called minimum entropy estimation (MEE), which tries to minimize the conditional entropy of the models given the input data. Since MEE makes no assumption about the correctness of the models' parameter space, it will perform no worse than other estimation methods such as maximum likelihood ...
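A small sketch of the quantity MEE minimizes, the conditional entropy of the models given the data, computed here from a matrix of posterior probabilities under an assumed uniform weighting over samples (illustrative, not the paper's implementation):

    import numpy as np

    def conditional_entropy(posteriors):
        """H(M|D) in nats for posteriors of shape (n_samples, n_models),
        each row summing to 1; MEE picks parameters that make this small."""
        p = np.clip(posteriors, 1e-12, 1.0)   # guard against log(0)
        return -np.mean(np.sum(p * np.log(p), axis=1))

    confident = np.array([[0.98, 0.01, 0.01], [0.01, 0.98, 0.01]])
    uncertain = np.array([[0.40, 0.30, 0.30], [0.30, 0.40, 0.30]])
    print(conditional_entropy(confident))   # small: models well determined
    print(conditional_entropy(uncertain))   # large: parameters fit poorly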

Minimax Mutual Information Approach for Independent Component Analysis

Minimum output mutual information is regarded as a natural criterion for independent component analysis (ICA) and is used as the performance measure in many ICA algorithms. Two common approaches in information-theoretic ICA algorithms are minimum mutual information and maximum output entropy approaches. In the former approach, we substitute some form of probability density function (pdf) estima...
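A compact sketch of that former approach for a linear unmixing $y = Wx$: since $I(y) = \sum_i H(y_i) - \log|\det W| - H(x)$ and $H(x)$ is fixed, minimizing $\sum_i \hat H(y_i) - \log|\det W|$ over $W$ minimizes the output MI. The Vasicek spacing estimator below stands in for the pdf estimate mentioned in the abstract; it is an illustrative choice, not the paper's method:

    import numpy as np

    def vasicek_entropy(y, m=None):
        """Rough m-spacing estimate of differential entropy (nats)."""
        y = np.sort(y)
        n = len(y)
        m = m or int(np.sqrt(n))
        spacings = y[m:] - y[:-m]   # m-spacings of the order statistics
        return np.mean(np.log(n / (2.0 * m) * spacings))

    def output_mi_contrast(w, x):
        """sum_i H(y_i) - log|det W|; equals I(y) up to the constant H(x)."""
        y = w @ x
        return (sum(vasicek_entropy(row) for row in y)
                - np.log(abs(np.linalg.det(w))))

    rng = np.random.default_rng(3)
    s = np.vstack([rng.uniform(-1, 1, 2_000),        # independent sources
                   rng.laplace(size=2_000)])
    a = np.array([[1.0, 0.5], [0.3, 1.0]])           # mixing matrix
    x = a @ s
    print(output_mi_contrast(np.linalg.inv(a), x))   # low at the true unmixing
    print(output_mi_contrast(np.eye(2), x))          # higher with no unmixing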

Journal:
  • Entropy

Volume 15, Issue —

Pages —

Publication date: 2013